Learning Negative Mixture Models by Tensor Decompositions
Authors
Abstract
This work considers the problem of estimating the parameters of negative mixture models, i.e. mixture models that may involve negative weights. The contributions of this paper are as follows. (i) We show that every rational probability distribution on strings, a representation which occurs naturally in spectral learning, can be computed by a negative mixture of at most two probabilistic automata (or HMMs). (ii) We propose a method to estimate the parameters of negative mixture models having a specific tensor structure in their low-order observable moments. Building upon a recent paper on tensor decompositions for learning latent variable models, we extend this work to the broader setting of tensors having a symmetric decomposition with positive and negative weights. We introduce a generalization of the tensor power method for complex-valued tensors and establish theoretical convergence guarantees. (iii) We show how our approach applies to negative Gaussian mixture models, for which we provide some experimental results.
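To make the power-method ingredient concrete, here is a minimal NumPy sketch of a symmetric tensor power iteration carried out in complex arithmetic, in the spirit of the generalization described above. It is an illustrative sketch only: the function names, the phase-fixing convention, and the toy example are assumptions of this sketch, not the authors' algorithm.

import numpy as np

def fix_phase(theta):
    # Remove the global phase ambiguity theta -> exp(1j * a) * theta by
    # rotating theta so that its largest-modulus entry is real and positive.
    k = np.argmax(np.abs(theta))
    return theta * np.exp(-1j * np.angle(theta[k]))

def complex_tensor_power_iteration(T, n_iter=200, seed=0):
    # T: complex (d, d, d) array assumed to admit a symmetric decomposition
    # T = sum_i w_i * v_i (x) v_i (x) v_i, where the weights w_i may be
    # negative. Returns one candidate (eigenvalue, eigenvector) pair; to find
    # further components one would deflate T by lam * theta^{(x)3} and rerun.
    d = T.shape[0]
    rng = np.random.default_rng(seed)
    theta = rng.standard_normal(d) + 1j * rng.standard_normal(d)
    theta = fix_phase(theta / np.linalg.norm(theta))
    for _ in range(n_iter):
        # Bilinear (unconjugated) contraction u = T(I, theta, theta).
        u = np.einsum('ijk,j,k->i', T, theta, theta)
        theta = fix_phase(u / np.linalg.norm(u))
    lam = np.einsum('ijk,i,j,k->', T, theta, theta, theta)
    return lam, theta

# Toy check: a rank-2 symmetric tensor with one positive and one negative weight.
d = 4
rng = np.random.default_rng(1)
V = np.linalg.qr(rng.standard_normal((d, 2)))[0]  # two orthonormal columns
T = (1.0 * np.einsum('i,j,k->ijk', V[:, 0], V[:, 0], V[:, 0])
     - 0.5 * np.einsum('i,j,k->ijk', V[:, 1], V[:, 1], V[:, 1])).astype(complex)
lam, theta = complex_tensor_power_iteration(T)
# lam should be close to +/-1.0 or +/-0.5: each weight is recoverable only up
# to the sign ambiguity w * v^{(x)3} = (-w) * (-v)^{(x)3} of odd-order tensors.
print(lam)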
Similar resources
Tensor Decompositions for Learning Latent Variable Models
This work considers a computationally and statistically efficient parameter estimation method for a wide class of latent variable models, including Gaussian mixture models, hidden Markov models, and latent Dirichlet allocation, which exploits a certain tensor structure in their low-order observable moments (typically, of second and third order). Specifically, parameter estimation is reduced to the...
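For context, the tensor structure referred to in these abstracts has, schematically, the following form for a k-component mixture with weights w_i and component parameter vectors \mu_i (the notation here is illustrative; the exact moment expressions differ by model):

M_2 = \sum_{i=1}^{k} w_i \, \mu_i \otimes \mu_i, \qquad M_3 = \sum_{i=1}^{k} w_i \, \mu_i \otimes \mu_i \otimes \mu_i.

When all w_i > 0, whitening M_3 using M_2 produces an orthogonally decomposable tensor whose robust eigenpairs recover the pairs (w_i, \mu_i). With negative weights, M_2 need not be positive semidefinite, so the whitening step can yield complex-valued tensors; this is the setting the complex power method sketched above is meant to handle.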
Introduction to Tensor Decompositions and their Applications in Machine Learning
Tensors are multidimensional arrays of numerical values and therefore generalize matrices to multiple dimensions. While tensors first emerged in the psychometrics community in the 20th century, they have since spread to numerous other disciplines, including machine learning. Tensors and their decompositions are especially beneficial in unsupervised learning settings, but are gaining popularity...
Uniqueness of Tensor Decompositions with Applications to Polynomial Identifiability
We give a robust version of the celebrated result of Kruskal on the uniqueness of tensor decompositions: we prove that given a tensor whose decomposition satisfies a robust form of Kruskal’s rank condition, it is possible to approximately recover the decomposition if the tensor is known up to a sufficiently small (inverse polynomial) error. Kruskal’s theorem has found many applications in provi...
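For reference, a standard statement of the Kruskal condition mentioned here (quoted from the general literature, not from this abstract): if a tensor admits a decomposition

T = \sum_{r=1}^{R} a_r \otimes b_r \otimes c_r

with factor matrices A = [a_1 \cdots a_R], B, C satisfying k_A + k_B + k_C \ge 2R + 2, where k_X denotes the Kruskal rank of X (the largest k such that every set of k columns of X is linearly independent), then the decomposition is unique up to permutation of the summands and rescaling within each summand.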
Efficient Orthogonal Tensor Decomposition, with an Application to Latent Variable Model Learning
Decomposing tensors into orthogonal factors is a well-known task in statistics, machine learning, and signal processing. We study orthogonal outer product decompositions in which the factors in the summands are required to be orthogonal across summands, by relating this orthogonal decomposition to the singular value decompositions of the flattenings. We show that it is a non-t...
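As a toy illustration of the flattening idea under the orthogonality assumption stated above (the setup and variable names below are my own, not the paper's):

import numpy as np

# Build T = sum_r sigma_r * u_r (x) u_r (x) u_r with orthonormal u_r.
d, k = 5, 3
rng = np.random.default_rng(0)
U = np.linalg.qr(rng.standard_normal((d, k)))[0]  # orthonormal columns
sigma = np.array([3.0, 2.0, 1.0])
T = np.einsum('r,ir,jr,kr->ijk', sigma, U, U, U)

# Mode-1 flattening: a d x d^2 matrix equal to sum_r sigma_r * u_r vec(u_r u_r^T)^T.
# Because the vectors vec(u_r u_r^T) are orthonormal whenever the u_r are, the
# SVD of the flattening recovers the factors (up to sign) and the weights.
M1 = T.reshape(d, d * d)
Uhat, svals, _ = np.linalg.svd(M1, full_matrices=False)
print(np.round(svals[:k], 3))                  # approx. [3. 2. 1.]
print(np.round(np.abs(Uhat[:, :k].T @ U), 3))  # approx. the identity matrix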
Journal: CoRR
Volume: abs/1403.4224
Pages: -
Publication date: 2014